[MUSIC PLAYING] Technology is evolving faster and faster. Not surprising when you consider that today, more scientists are doing research than in all of history combined, and using superior instruments and communication tools. New technologies like biogenetics, artificial intelligence, implants, and nanotechnology have advanced vastly in the past few decades. These various technologies now seem to be converging towards one goal: to overcome human limits and to create new, higher forms of life, to create something transhuman. Robert Anton Wilson has been calculating this acceleration of knowledge. He calls it the jumping Jesus phenomenon. The jumping Jesus phenomenon is my name for the acceleration of information throughout history. I first heard of that from Alfred Korzybski, a Polish mathematician who invented a scientific discipline called general semantics. And Korzybski noted that information was doubling faster and faster every generation. And he said we've got to be prepared for more and more change. We've got to train ourselves to be less dogmatic and more flexible so we can deal with change. He took 1 AD as his basic unit to calculate how long it took for the information available to human beings to double. And it took 1,500 years, which brought us up to the time when Leonardo da Vinci was in his 40s and the Renaissance was at its height. I decided to call this unit a Jesus. So in 1 AD, we had one Jesus. In 1500, we had two Jesus. The next doubling only took 250 years. Already, you can see the acceleration factor. And by 1750, we had four Jesus. The next doubling took 150 years. And by 1900, we had eight Jesus. The next doubling only took 50 years. And by 1950, we had 16 Jesus. By 1960, in only 10 years, we had 32 Jesus. By 1967, we had 64 Jesus. And by 1973, 128 Jesus. And the latest estimate I've seen, by Dr. Jacques Vallée, a computer scientist, is that knowledge is doubling every year. But I heard that, oh, about five, six years ago. 
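Wilson's doubling timeline can be tabulated directly. A minimal sketch, using only the dates and unit counts quoted above ("Jesus" is Wilson's own unit name):

```python
# Wilson's "jumping Jesus" timeline: the years at which, by his account,
# humanity's total information had doubled again.
# One "Jesus" = all information available to humanity in 1 AD.
doubling_years = [1, 1500, 1750, 1900, 1950, 1960, 1967, 1973]

for n, year in enumerate(doubling_years):
    # After n doublings, information = 2**n Jesus units.
    print(f"{year:>4} AD: {2 ** n:>3} Jesus")
```

The shrinking gaps between consecutive entries (1499, 250, 150, 50, 10, 7, 6 years) are the acceleration the speaker describes.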
I saw something on the net recently. Somebody estimated it's doubling about twice a year now. Obviously, if we're experiencing more change now in a year than we previously experienced in 1,000 years, we can extrapolate that trend into the future and see that a day will come when we will experience more change in an hour than we have experienced in the past 20 or 30,000 years. A situation like that is unimaginable. So we call it a singularity, a place where the normal rules of modeling break down. Modern religions have anticipated the singularity by calling it the eschaton, or the end of time. Technological communities have anticipated the singularity by thinking in terms of artificial intelligences or something like that. In whatever form it takes, we seem to be on the cusp of a dramatic evolutionary leap into a deeper order of complexity than biology, or biology plus culture, has been able to provide. We're on the brink of something truly awesome and unknown. The world ends tomorrow. [INAUDIBLE] [INAUDIBLE] In a future where knowledge evolves at infinite speed, it is clear that no regular human being will be able to keep up with this acceleration. [INAUDIBLE] Some futurologists anticipate that species of higher intelligence will at one point take over further progress. [INAUDIBLE] These might be artificial intelligences, genetically upgraded humans, or a combination of both. [INAUDIBLE] Most calculations anticipate this moment to be between 2035 and 2045, so it is not a far-off science fiction scenario. [INAUDIBLE] How can we deal with such a future? [INAUDIBLE] Are you guys only worshipping Jesus or like God? It's like you're just saying that only Jesus can fill that place, and I'm saying that to every person there's an intricate savior. You know what I'm saying? It doesn't have to be just Jesus. How do we prepare for such a technocalypse? 
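The extrapolation in this passage can be made concrete with a toy model: if each successive doubling of knowledge takes a fixed fraction of the previous interval, the doubling time falls below any threshold after finitely many steps, which is the "place where the normal rules of modeling break down." The starting interval and shrink ratio below are illustrative assumptions, not figures from the film:

```python
def doublings_until(interval_years, ratio, threshold_years):
    """Count doublings until the doubling interval shrinks below a threshold."""
    n = 0
    while interval_years > threshold_years:
        interval_years *= ratio  # each doubling takes `ratio` times as long
        n += 1
    return n

one_hour = 1 / (365 * 24)  # an hour, expressed in years
# Assumed numbers: a six-month doubling time to start, each interval
# two-thirds the length of the last.
print(doublings_until(0.5, 2 / 3, one_hour))  # prints 21 under these assumptions
```

With any ratio below 1 the intervals form a geometric series, so the total time to reach the threshold is finite; that finite horizon is the "singularity" of the model.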
The singularity I put in 2045, at that point, the non-biological intelligence that we create that year will be about a billion times greater than all biological human intelligence. There are a growing number, maybe a few hundred people, who are seeing the writing on the wall that these technologies are coming this century. And they will allow humanity, if we, as human beings, as a species, if we choose-- and it's a critical concept-- if we choose, we could build god-like, massively intelligent machines with capacities, oh God, trillions of trillions of times above ours. I mean, they may reach a certain level of artificial intelligence, where they themselves then start redesigning themselves. The singularity, that idea, when a threshold level of intelligence is reached. And then it's no longer human beings who design the next generation. They do it. And they're doing it at the speed of light, electronically. So whoo! Up goes the AIQ very fast, and we just lose control. We just sit back and watch what happens. And they're the boss. Well, for a long time, I've been trying to understand why there is not more resistance to various technological agendas. The computer is an instrument that enables humans to be much more powerful than they ought to be. And so anything coming from the computer revolution, I would regard as dangerous, ultimately dangerous. In fact, I would say that within the next 20 to 30 years, we will see the catastrophe. One of the things about technological development, there's never a moment that we can, as I say, stop and evaluate, because everybody's saying, oh, wait, wait, wait, wait, it's going to get better. It's going to get better. Oh, yes, there are problems now. There are bugs now, but we will solve them. And so there's an infinite deferring of evaluations and decisions. And that's irrational. You ain't got much time left, because God said, Christ said if you don't cut the time short, there'll be no flesh left to inherit the earth. 
You're known as the mad scientist. The accelerating pace of innovation is not only destabilizing, but throws everyone reeling onto the defensive. Well, we call it stress. People are dropping like flies. Part of it is just the anxiety. Is this the way we want to live? Why not slow things down? What's the hurry? I'm not interested in a future where there are post-human beings if I'm not one of them. I want to be there. It's very important to me that I live and thrive in the future, not just to think about some robots taking off into space. I want to be there. They could take advantage of intelligence theory and wire up their circuitry in a way that makes their minds Einstein plus, plus, plus, plus. So they could be incredibly smart. They'll be like gods to us, human beings with our puny little brains. There are two things that are infinite, the universe and human stupidity. The human species is one that has a brain that's already too big. And now we have in the computer something that extends the power of that brain almost limitlessly. The machines are not the problem. It's the monkeys behind the machines that need to be addressed. We're all stupid. We don't think very well. We can't remember more than about seven numbers in a row before our brain gives up. We have very bad short-term memories. We can't think through long chains of reasoning without forgetting where we started. Those are all design defects as far as I'm concerned. So is that an enhancement to improve that, or is it repairing a defect? It doesn't really matter. We should be using technology to make ourselves better. I am convinced that these technologists at MIT, in all their fancy laboratories up there in artificial intelligence, are correct when they say that the question in another 50 years will be whether the machines want to keep us around as pets or not. Who is to say that for whatever reason, sometime in the future, those artilects may decide: those human beings? Pests. They are so inferior to us. 
They get in the way. They're eating up our resources that we need, whatever, taking up our atoms that we need to go bigger and bigger and bigger. Well, I do worry, but I'm not overly concerned about the possibility that they will race ahead of us and, if we're lucky, treat us as pets. I think more likely what's going to happen is that we will become increasingly integrated functionally with our technology, with our computers, with our robots. And rather than it being a them-and-us situation where they're the ruling class and we're the poor serfs underneath, I think we'll be much more integrated. Well, that is the most tricky question. Are we going to retain the monkey meat? Are we going to hang on to the body and through the body have a connection to the rest of animal nature? Or are we going to become disembodied streams of electrons moving in virtual realities that are contained entirely in circuitry? I think this will probably go both ways. There will be fundamentalists who want nothing to do with technological transformation. And there will be utopians who won't be able to get enough of it. This is probably the moral frontier where we each personally must make a stand. How much of the new technology and its reality-redefining qualities do we want to take into our own lives? We will have cyborg bodies. We will have augmented bodies. We will automorph ourselves into whatever vehicle keeps us in existence for the longest period of time with the most pleasure, the least pain, and the most ease, the most elegance. Everything that you need. Everything. Spiritually, emotionally, financially. The post-biological folks will say, well, you people have a primitive, atavistic attachment to the species. Let it go. Let it go. Come with us. We're the bold ones. We'll move on to the-- that's really scary stuff to me. Really scary. 
A great number of people, who I refer to as biological fundamentalists, have a great fear of moving outside their biology, moving into other bodies, other forms of transport. These people have a lot of commitment, a lot of mythology, a lot of tradition and history attached to their bodies. Our sacred bodies. The house of our soul. I mean, I can imagine myself being a moderately modified cyborg. I mean, I'd like to have my arteries cleaned out, and I'd like to have better memory capacity, and I'd like to be able to learn Chinese in two seconds just by going, [SNAPS FINGERS] right? That would be great. But that, to me, is small beer. That's nothing in comparison to what these god-like-- I deliberately use that word, god-like-- artilects could be. Well, it's a philosophical issue as to whether this is still human. In my mind, it's definitely going beyond biology. But I don't define human as just biological. I mean, we're already taking steps beyond biology. There's not a single organ in the human body, including regions of the brain, where we're not already creating substitutes or extensions or augmentations. So if somebody has an artificial pancreas, are they not human? If they have a neural implant in their brain, are they not human? How about two neural implants? So maybe you can have up to 10, then you're human. But 11, you're not human anymore. If you have these nanobots, blood-cell-sized robots, in the brain that actually have computers interacting with your biological neurons, is that still a human? Well, one nanobot's probably OK. How about 500 million nanobots? I don't know how people are going to stop this. 
Although my hope is that when we effectively destroy half of human life on Earth through one technological mistake or another, or a combination of them, the diseases that we are allowing to come out now, or the poisons that we create, or the global warming, one or another of these things, destroys half the population on Earth, the other half will say, whoa, we've gone too far. What do we do now? Humanity will become extinct, but in the sense of a pseudo-extinction, in that it is clear that if we are going to be altering our own biology and transforming ourselves, we are going to change. And we're going to change in enormous ways, in very unforeseen ways. And that means that Homo sapiens, as we know it now, will cease to exist. But that's got to be a form of madness. It's got to be a form of techno-blindness to want the end of the human species, to transform it into a mechanical species. It's a kind of speciescide, the destruction of a species, as against a kind of, I use the word deicide, a god-killing. If you choose not to build these artilects, in a sense you're removing the possibility of creating gods. It's a kind of potential death of god, if you like. Now, which is worse? This is suicide. This is ecocide. I think it has begun to sink in, and there are many that would say, stop. We've got to draw a line there. But it can't be stopped, because this is a whole broad front that is moving forward. Genetics is one aspect of it. Computers are another aspect of it. Communications are another aspect of it. Everything is changing about life, and it's changing increasingly rapidly. Should we stop it? Should we sort of say, OK, AIQ, artificial intelligence quotient, at this level, we're going to legislate, perhaps even globally. That's the cutoff point. No robot, no artificial intelligence, no artilect, if you like, is allowed by law, by human law, to go beyond that. Do you think that's going to happen? I don't. I don't think that's going to happen at all. 
It's physical technology. He's saying what? Who can bring me down? There is no human, there is no social, there is no governmental control over what these technologies are doing and will do. How are you going to stop a trillion-dollar industry from building these godlike machines? The robot industry, the future Microsofts and Intels and so on, in the 2030s, 2040s, I believe will be artificial-brain based and just enormous. There may be enormous political rivalries for the dominant country between China and the US. Now, if that's the case, the development of this artificial intelligence will just keep growing. Even if it goes secret, even if the general population start to really rebel against it, as they get really upset. They are dirty. Why is this up? God always speak. Always. Happy as he has. But I think the major reason why it's not going to stop is that about half of the population will be ideologically in favour of it going on. That's why the so-called white man put up himself as God. Saying what? If there has to be a God, I am. We are going to become gods. Period. If you don't like it, get off. You don't have to contribute. You don't have to participate. But if you're going to interfere with me becoming God, you're going to have big trouble. Then we'll have warfare. For one group, you'd be building gods. It's just amazing. It's awe-inspiring. It's energising. It's setting a goal for humanity. There's the whole universe out there. The big picture. And on the other hand, the potential risk of seeing the whole human species annihilated by these superior creatures who could just-- And you did what you wanted when you wanted. But you thought you were God. You thought you were God's family. You thought you were God's school. And they have trouble. The only way you can prevent me in this 50 year-- is to kill me. If you kill me, I'll kill you. I'm anticipating the most passionate war that has ever been. 
We're not talking about the survival of a tribe, or a people, or a country. We're talking about the survival of a whole species. So the passion levels will be extreme. I deserve hell. You deserve hell. You've got to see that you deserve the wrath and anger and vengeance. What I'm talking about is a renewed embrace of our existence. It's got problems. And it's finite. But I'm not driven to flight from it. We have to learn humility. Species humility. The more I think about that versus that, the more magnificent that becomes. Because its potential is just so much greater. A billion, billions of people get wiped out; on the cosmic scale, as a Cosmist, right, if you try to think in those terms, that's nothing compared to that. But if you can't see that, this is absolutely monstrous. You just don't even want to talk to them. Probably most people can anticipate horror. It's a fairly simple concept to understand. People getting killed. People get run over by cars all the time. You just multiply that by a billion, right? That's relatively easy to understand. That is much harder. It's very exciting, because something will happen. Without that, nothing will happen. And maybe we'll have a civilization that lasts for a million years and does the same thing every day. And then when the sun gets too hot or too cold, it goes away. So I don't see the value of anything that doesn't change. You're saying, in x number of thousands of years, the sun is going to explode. So we have to prepare for it now. And I say, well, OK, that's interesting. But how about preparing for the essentially toxic shock of our planet today? Or if you walk around, you see a lot of people with no housing, no place to live, not enough food. So the explosion of the sun can wait. Only a fool is going to pollute the water that his own damn kids have to drink. Only a fool is going to what? He's going to pollute the air that he has to breathe. 
This is a moment in time where, for the first time, we can truly imagine feeding everyone. I mean, this is something new, without precedent. We are able to have more humans. We are able to fight against diseases, childhood diseases, so that there can be more people, and they can live longer. But isn't this wonderful? Well, no. Of course it's not wonderful. That's exactly the problem. There are too many people, and they're living too long, and they're using up the resources of the Earth for the human species. One species out of a billion species is using up all of the resources for its own material betterment. [MUSIC PLAYING] Every moment that I speak, another species is being extinguished. We are moving into the unknown, and we see lots of possibilities for catastrophe. And the immediate reaction is, well, we should stop development in some way. And yet, we can't. We're on this raceway. We can't stop it, because if we try to stop it, it will be completely unsustainable. The human population has risen to a magnitude that could not be sustained without technology. [ROARING] [MUSIC PLAYING] Well, I can tell them what we do now. We establish small communities with small technologies. It's very simple. [MUSIC PLAYING] Just as we have today the Amish, who want to stabilize their culture at a late-19th-century level of technology, I can imagine a group-- I like to call them the humanish-- who might want to stay merely human and say, OK, we'll keep our internet connections, but let's not have any brain plugs. Let's not upgrade our intelligence. Let's just keep things as they are today. That's enough. And if they want to do that, fine. I think they should be quite able to do that. But most of us, I think, will see the advantages of living longer, getting smarter, refining our emotions and our personality. So if the thunder don't get you, then the lightning will. 
We've been traveling this technological path for 15,000 years. It's a little late to now rear up and try and turn in a new direction. Our hands, our minds were made for the manipulation of matter and the creation of technology. So I embrace it. I think the future is marvelously exciting and bright. We have all these devices. You're a technophobe. You're a Luddite. To silence and dismiss any and all criticism, not only as irrelevant and irrational, but also as irreverent. And if you're critical, you're not being reverent enough. Why is it one of my life goals to raise the alarm, to try to generate this debate? And I don't know if it's a simple answer, but one answer is because I'm so ambivalent. See, if I were 100% Cosmist, I would just quietly do my work and not say anything. Just do it. Because I want the technology to be there. And the faster it's there, less climbing time. So more likely that these artilects would be built quietly. And then suddenly, fait accompli, they're just there. I think we need to downplay the caution and move ahead faster. The debate will heat up and rage. And then people take sides. And then you start getting assassinations and sabotage. Gigadeath. The worst war that we've ever seen. We have to grow up as a people. And the faster we do that, the better it will be for us all. I just feel profoundly schizophrenic on this issue. Torn. Because it's so horrible and it's so magnificent. Sorry. [INAUDIBLE] The Lord is near. And it's near. [MUSIC PLAYING] [CHANTING] I don't see that we are going to go back. You don't have that choice. That we're going to be able to stop this. [MUSIC PLAYING] [CHANTING] Are we going to deny the creation of all this magnificence? You don't have that choice. 
All this incredible stuff that may be coming in the future. And you don't even deserve it. That's a tragedy. That would be horrible if we did that. [MUSIC PLAYING] They could go anywhere. They would be immortal. They would be immortal. You will be joyful. You can live with God forever and ever and ever and ever and ever. It's good. It's very curious. That is, we're not able to be rational about this presumably most rational of activities. If nature is holy, anti-nature technology must be the devil. God, you're the devil. God, you're the devil. I got a message from God, you hear me? I got a message from God to you all. [MUSIC PLAYING] Now the most pressing issue is aging and death. And human stupidity, which makes it so hard to solve problems. So to me, now all the incentive is towards speeding up development. [MUSIC PLAYING] The record is clear, the historical record, that human beings are not capable of safely and reliably using that technology. The potential for mischief is enormous. And unless human beings somehow reinvent themselves-- Look, I think the solution for us today is the same as it was for the Luddites. And the Luddites took a hammer like this, a big 10-pound hammer, and they smashed the machines that bothered them. And this is a machine, I think, that bothers me and bothers our life and will destroy our life. Here's what we should do. [GUNSHOTS] Since 1978, the Unabomber has struck 16 times, killing three people and injuring 23. This may be what he looks like. Sir, are you the Unabomber? Sir, are you the Unabomber? Do you have anything to say at all? No. Would you care to make a statement, sir? Are you the Unabomber? [GUNSHOTS] Theodore Kaczynski was the Unabomber. He was a brilliant mathematician and assistant professor at the University of California. After two years, he resigned without any explanation and went to live in a small cabin in the woods. 
There, he proceeded to form his personal terrorist group, send letter bombs to scientists and propagators of technology, and become US public enemy number one, the bin Laden of the '80s and '90s. In his manifesto, published by some major newspapers, he fulminates against modern society. He says that the proponents of modern technology are naive in their optimism. If scientists succeed in making intelligent machines, he says, the fate of the human race will be at the mercy of the machines. Kaczynski maintains that the bad and good aspects of the industrial technological society cannot be separated. Therefore, he declared revolution against modern society. The destruction of the system must be the revolutionary's only goal. The factories should be destroyed, technical books burned. The alternative proposed by Kaczynski is going back to nature. He calls it wild nature, because he understands that this also means giving up social justice, modern health care, and welfare. I'm convinced that what the Unabomber was saying, a lot of people think. He took it another step further. And as he says in the manifesto, in order to be heard, I had to kill people. Or we, because the manifesto was written as if it were a group, we had to kill people. Well, that's a horrible thought, but it's also true. How do we create the space and the attention for these issues? Kaczynski did it by killing people. Well, maybe there are better ways of doing that. I don't know. What can we expect in the coming decades? The prospects of technology appeal to people's deepest fears, as well as to their highest hopes. A rational debate about the future seems no longer possible. In fact, such discussion, by definition, is beyond our human capacities. Aren't we still too human to be prepared for becoming transhuman? It is an enormous privilege to be alive right now and to be a part of this movement. And it's a very difficult time. I don't deny that at all. 
I look at it more as a birth pain, that here is this change that's occurring. And we're coming down the birth canal. And there's blood around. And it's a difficult process. But it's a birth that is occurring. It's not a death. And I think that when future humans look back on this era from several hundred years in the future, they will not remember this time as one when we squandered the resources of the planet, when we destroyed our planet, essentially. They'll look at this as an extraordinary moment in time when we developed the basic technologies, the basis for their societies. We developed genetic engineering, the ability to basically rework our own genetic blueprint, which is something unprecedented. It's never happened in 3 and 1/2 billion years of history. We began to move out into space. For 3 and 1/2 billion years, life, all of life, has been constrained to this thin film on the surface of the planet. And now, quite suddenly, in just an instant in time, we're moving out towards the universe. And the third thing that has occurred in our era is artificial intelligence, computers. Suddenly, non-living material is beginning to achieve a level of complexity that rivals that of life itself. So these are three things that are absolutely unprecedented in the history of life. And they're occurring now. And is it any surprise that this is something that is jarring and shaking up the environment and causing extinctions? This is going to provide the basis not just for the next hundred years, but for thousands of years, tens of thousands, if not millions of years. And I see a long future stretching out ahead. Some people are already preparing to become part of this transhuman era. One of them is Terence McKenna, an ethnobotanist who studied shamanism in the Amazon. According to McKenna, to get ready for a future life where we will live uploaded in computers, we must first prepare ourselves mentally. 
The best way to do so, he believes, is through the use of psychedelic drugs. In a sense, this historical crisis, or this singularity that we're approaching, is like a transition from a low-dimensional world, say a world of two or three dimensions, to a world of four, five, or six dimensions. This is what I believe actually happens to a human brain-mind system under the influence of psychedelics. So in a way, the best practice for the approaching singularity is the repeated dissolving and reconstituting of one's personality through the use of psychedelic substances. This is one of the most interesting new psychedelics in the world. This is Salvia divinorum. And it is definitely one of the plants which will shape the next few decades of the new millennium. This is a coleus. It's ironic that these plants, which have been in our kitchens and in our windowsill flower beds for generations, turn out to contain psychoactive compounds as powerful as any known to science. These are not particularly interesting in terms of drugs, but they're certainly bizarre. When I take psychedelics, I always do it in a shamanic style, usually at night, usually alone, in nature, if possible, and then I watch. I pay very close attention. I use my mind as an alchemical vessel for carrying out observations on the union of spirit, mind, and matter. My spirit, my personality, and matter, the physical matter of the substance that I'm ingesting. Nothing in human experience is as much like the singularity as a psychedelic experience. In a way, it's a microcosmic anticipation of this macrocosmic event in history. When we take psychedelics, we undergo a mini-apocalypse, a mini-revelation, and it positions us, then, for these larger events in the historical time stream. If you'd like to climb up here, we may. This is one of the most interesting plants in the garden. This is Psychotria viridis. This is the plant which causes the vision. 
When taken with ayahuasca, when taken as a liquid, the experience lasts about four to six hours. It's not as intense as smoking it. Smoking it is the most intense experience, this side of the yawning grave. When you can see the future, it is a very difficult thing to deal with the idea that we are, in fact, one of the last generations that will live a normal human lifespan. And there's a tendency for us, therefore, to want these things to occur more rapidly than they may actually occur. This is the number one intellectual challenge. Namely, you get to rejuvenate yourself. Now, how much is that worth? That's worth a great deal of money. As a matter of fact, when you get people my age, it's worth all the money that they have. I don't think people live long enough right now. It takes 50 years to understand modern mathematics. We're into life extension. Bottom line, we don't want to die. It's not a question of will we, it's a question of when will we have in our hands the interventions at the molecular level, at the genetic level, that will truly extend the human lifespan, perhaps hundreds of years into the future. The only thing that you have to do is stay alive. That is all you have to do. I'm not sure what methods are actually going to work. But if you and I are going to live until those methods become available, we had better do everything that we can to protect ourselves by making sure that we have a healthy lifestyle. And that includes eating the proper foods, taking your vitamins and mineral supplements. And it includes exercise. And everybody needs to have approximately a half hour a day of exercise. And stress reduction, because stress is another way that you can certainly reduce your lifespan. 
Doing those three things in different ways is a critical part of what I call the bridge plan, the plan or recommendations that you need to adopt in your life in order to bridge yourself from where you are now until the time in which all of these wonderful scientific medical interventions will become possible. Some people say, if we live for 1,000 years or even 200 years, we will have nothing to do. We will get bored. And it will be pointless. And I say, well, first of all, how do we know? Maybe we'll get bored. Maybe we won't. If we do, we can always just stop taking the medicines. Or we can jump off a cliff or whatever. But how do we know? We should do the experiment. But the second thing that I say is, well, subjectively, if I look at myself, how could I possibly ever get bored of punting? I mean, what can be better? What can possibly be better in life than just strolling down the river with a beer in one hand and a pole in the other hand? I could live for a million years, and I would never get bored of doing this. Of course, some people will not be able to live long enough to benefit from these therapies. Some people who are already much older than me, for example, much older than you, have no chance, really. I am 42. I think I have some chance of living long enough to live forever. But there is a good chance that the science will go more slowly, and so it will not be good enough. So I think that anyone who is interested in living a long time should think very seriously about cryonic preservation. Cryonics. There's experimentation now with the possibility that a body might be frozen and revived. If the frozen traveler awakes, what will he find? The advance of science tempts us to speculate on the nature of the world which lies ahead. They told me a little bit about the basic idea of cryonics, and it just seemed like a great idea. I think it's pretty cool. Like, you die, and you come back, and you see whatever happened in the future. 
For me, I feel like it's just a waste. Maybe it's because I'm a woman; that's just the way I feel. Cryonics is a procedure by which people are frozen shortly after death in the hope that the information encoded in the brain will be usable to reconstitute the person sometime in the future. [INTERPOSING VOICES] Some people currently only have their head suspended. It's a cheaper process. It takes less nitrogen boil-off to maintain such a thing. But there could be some argument that they're losing some part of what they are. It is their belief that future science will be able to develop maybe a new body or an android or something. More and more people only go for the head, not just because it's cheaper. The process is also easier to control. However, how to recover your body afterwards still remains an open question. But there are many examples in nature that make it seem possible. Some animals, like lizards, have the ability to regenerate a limb after it is lost. In humans too, when a reasonably small part of the liver is cut off, it can grow back. Or when children before puberty lose part of a finger, the finger can grow back to normal if treated well. This indicates that this kind of regeneration is not impossible. It's not impossible because it's being done by nature right now. So if we can simply find those secrets, the secret of the salamander, and find how the salamander regenerates, then we may be able to apply the same kinds of amphibian techniques to human beings. Other possibilities. We may have vastly improved bodies. They may be made of metal. They may be much different than the bodies that you're looking at right here. However, the mind would be the same. And with this mind in a metal body, I might be able to walk around on some of the heavy planets, on Uranus, for example, or on Jupiter. So that is a possibility that's suggested. 
Another possibility is simply living inside a computer, where we're living in a virtual world, a virtual reality. And if that's the case, there's certainly no problem with space. We'd be able to cram the entire human race into that computer over there. Anyone making this trip knows that it may be a trip that goes on forever. Anyone making this trip knows that it may be the fact that every single person in suspension, including this dear lady, may never be revived. It's a crapshoot, but we're the only game in town. Ready? [SCREAMING] -I feel fine. [SCREAMING] -I feel fine. [SCREAMING] -I feel fine. You guys are not going to get to see the very bottom tanks. Nah, it's the biological. I think that's the biological. I can tell by the weight. Sorry. Not there. It seems obvious this is the way to go. [MUSIC PLAYING] -Cryonics has some religious aspects in that our heaven, in a way, is the future. And our god, in a way, is technology. [MUSIC PLAYING] -One of our associates suggests a program where you send a bunch of volunteers into the rainforest. And the volunteers are not trained people. They cannot tell an endangered plant or animal from anything else. But what they do, they collect grab-bag fashion. They have shovels, and they grab some of this. And they see a plant over here, they grab that. And they get all this kind of stuff. And 99.9% of this is going to be junk. But 1% may be very valuable. It may be a plant or an animal from the rainforest or some other environment which has previously not been discovered and which will be wiped out unless it's saved right now and saved in the best way that we can. I would love to have whole ecosystems stored in some of the vaults that the American Cryonics Society is constructing in out-of-the-way places in order to time-machine forward some of the library of life from the 20th century. 
[MUSIC PLAYING]